Enhancing Factorization Machines With Generalized Metric Learning
Authors
Abstract
Factorization Machines (FMs) are effective in incorporating side information to overcome the cold-start and data sparsity problems in recommender systems. Traditional FMs adopt the inner product to model second-order interactions between different attributes, which are represented via feature vectors. The problem is that the inner product violates the triangle inequality property of a metric. As a result, it cannot well capture fine-grained attribute interactions, resulting in sub-optimal performance. Recently, the Euclidean distance has been exploited to replace the inner product and has delivered better performance. However, previous FM methods, including the ones equipped with the Euclidean distance, all focus on attribute-level interaction modeling, ignoring the critical intrinsic correlations inside attributes. Thereby, they fail to capture the complex and rich interactions exhibited in real-world data. To tackle this problem, in this paper, we propose an FM framework equipped with generalized metric learning techniques to better capture these correlations. In particular, based on this framework, we present a Mahalanobis distance method and a deep neural network (DNN) method, which can effectively model the linear and non-linear correlations between features, respectively. Besides, we design an efficient approach for simplifying the model functions. Experiments on several benchmark datasets demonstrate that our proposed method outperforms the state-of-the-art baselines by a large margin. Moreover, we collect a new large-scale dataset on second-hand trading to further justify the effectiveness of our method over the state-of-the-art baselines.
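To make the modeling difference described above concrete, the minimal NumPy sketch below contrasts the classic FM second-order term (inner product between attribute embeddings) with a generalized-metric variant that scores attribute pairs by a negative squared Mahalanobis distance. All names (`fm_inner_product`, `fm_mahalanobis`, `M = L^T L`) and sizes are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_attrs, k = 6, 4                       # illustrative sizes
V = rng.normal(size=(n_attrs, k))       # one embedding vector per attribute
x = rng.random(n_attrs)                 # one input feature vector

def fm_inner_product(V, x):
    """Classic FM second-order term: sum_{i<j} <v_i, v_j> * x_i * x_j."""
    score = 0.0
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            score += (V[i] @ V[j]) * x[i] * x[j]
    return score

# Generalized-metric variant (illustrative): replace the inner product with a
# negative squared Mahalanobis distance (v_i - v_j)^T M (v_i - v_j), where
# M = L^T L is positive semi-definite by construction and captures linear
# correlations between embedding dimensions.
L = rng.normal(size=(k, k))
M = L.T @ L

def fm_mahalanobis(V, x, M):
    score = 0.0
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            diff = V[i] - V[j]
            score += -(diff @ M @ diff) * x[i] * x[j]
    return score

print(fm_inner_product(V, x), fm_mahalanobis(V, x, M))
```

With `M` set to the identity, the metric variant reduces to the plain (negative squared) Euclidean distance mentioned in the abstract; learning `L` is what adds the linear feature correlations, and the DNN method replaces this linear map with a non-linear one.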
Similar Resources
Distance Metric Learning for Kernel Machines
Recent work in metric learning has significantly improved the state-of-the-art in k-nearest neighbor classification. Support vector machines (SVMs), particularly with RBF kernels, are amongst the most popular classification algorithms that use distance metrics to compare examples. This paper provides an empirical analysis of the efficacy of three of the most popular Mahalanobis metric learning ...
An Efficient Alternating Newton Method for Learning Factorization Machines
Recently, factorization machines (FM) have emerged as a powerful model in many applications. In this work, we study the training of FM with the logistic loss for binary classification, which is a non-linear extension of the linear model with the logistic loss (i.e., logistic regression). For the training of large-scale logistic regression, Newton methods have been shown to be an effective appro...
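As a reminder of the objective this snippet refers to, the sketch below computes the FM prediction in its standard O(nk) form and plugs it into the binary logistic loss; the function names and shapes are assumptions for illustration, not the cited paper's algorithm.

```python
import numpy as np

def fm_score(x, w0, w, V):
    """FM prediction w0 + <w, x> + sum_{i<j} <v_i, v_j> x_i x_j,
    using the standard O(nk) reformulation of the pairwise term."""
    xv = V.T @ x                                       # shape (k,)
    pairwise = 0.5 * (xv @ xv - ((V ** 2).T @ (x ** 2)).sum())
    return w0 + w @ x + pairwise

def logistic_loss(y, score):
    """Binary logistic loss with labels y in {-1, +1}."""
    return np.log1p(np.exp(-y * score))
```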
Cross-Domain Collaborative Filtering with Factorization Machines
Factorization machines offer an advantage over other existing collaborative filtering approaches to recommendation. They make it possible to work with any auxiliary information that can be encoded as a real-valued feature vector as a supplement to the information in the user-item matrix. We build on the assumption that different patterns characterize the way that users interact with (i.e., rate...
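The real-valued feature encoding this snippet mentions can be illustrated with a small helper; the block layout (one-hot user, one-hot item, then auxiliary features such as cross-domain signals) is an assumed, typical FM encoding rather than the paper's specific design.

```python
import numpy as np

def encode_fm_input(user_id, item_id, aux, n_users, n_items):
    """Concatenate a one-hot user block, a one-hot item block, and
    real-valued auxiliary features into a single FM input vector."""
    x = np.zeros(n_users + n_items + len(aux), dtype=float)
    x[user_id] = 1.0
    x[n_users + item_id] = 1.0
    x[n_users + n_items:] = aux
    return x

# Example: user 3, item 7, with two auxiliary (e.g. cross-domain) features.
x = encode_fm_input(3, 7, np.array([0.8, 0.1]), n_users=100, n_items=50)
```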
Personalized Ranking with Pairwise Factorization Machines
Pairwise learning is a vital technique for personalized ranking with implicit feedback. Given the assumption that each user is more interested in items which have been previously selected by the user than the remaining ones, pairwise learning algorithms can well learn users’ preference, from not only the observed user feedbacks but also the underlying interactions between users and items. Howev...
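The pairwise assumption described above is commonly realized with a BPR-style objective; the sketch below shows that loss for one (positive, negative) item pair, where the scores could come from an FM or any other scoring model (names are illustrative).

```python
import numpy as np

def pairwise_bpr_loss(score_pos, score_neg):
    """-log sigmoid(score_pos - score_neg): the observed (positive) item
    should be scored above an unobserved (negative) one."""
    return -np.log(1.0 / (1.0 + np.exp(-(score_pos - score_neg))))

# Example: positive item scored 2.1, sampled negative item scored 0.4.
loss = pairwise_bpr_loss(2.1, 0.4)
```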
Higher-Order Factorization Machines
Factorization machines (FMs) are a supervised learning approach that can use second-order feature combinations even when the data is very high-dimensional. Unfortunately, despite increasing interest in FMs, there exists to date no efficient training algorithm for higher-order FMs (HOFMs). In this paper, we present the first generic yet efficient algorithms for training arbitrary-order HOFMs. We...
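For intuition about what "higher-order" means here, the naive definition of a third-order FM term is spelled out below; HOFM training relies on far more efficient evaluations, so this O(n^3) loop is only the definition written out, with assumed names.

```python
import numpy as np
from itertools import combinations

def third_order_term(V, x):
    """Naive third-order term: sum over i<j<l of
    (sum_f V[i,f] * V[j,f] * V[l,f]) * x[i] * x[j] * x[l]."""
    score = 0.0
    for i, j, l in combinations(range(len(x)), 3):
        score += np.sum(V[i] * V[j] * V[l]) * x[i] * x[j] * x[l]
    return score
```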
Journal
Journal title: IEEE Transactions on Knowledge and Data Engineering
Year: 2022
ISSN: 1558-2191, 1041-4347, 2326-3865
DOI: https://doi.org/10.1109/tkde.2020.3034613